Vapnik-Chervonenkis Generalization Bounds for Real Valued Neural Networks
Authors
Abstract
Similar Articles
Can Neural Networks Do Better Than the Vapnik-Chervonenkis Bounds?
Gerald Tesauro, IBM Watson Research Center, P.O. Box 704, Yorktown Heights, NY 10598. We describe a series of careful numerical experiments which measure the average generalization capability of neural networks trained on a variety of simple functions. These experiments are designed to test whether average generalization performance can surpass the worst-case bounds obtained from formal learning ...
Making Vapnik-Chervonenkis bounds accurate
This chapter shows how returning to the combinatorial nature of the Vapnik-Chervonenkis bounds provides simple ways to increase their accuracy, take into account properties of the data and of the learning algorithm, and provide empirically accurate estimates of the deviation between training error and testing error.
Distribution-Dependent Vapnik-Chervonenkis Bounds
Vapnik-Chervonenkis (VC) bounds play an important role in statistical learning theory, as they are the fundamental result explaining the generalization ability of learning machines. Over the years there has been substantial mathematical work on improving the VC rates of convergence of empirical means to their expectations. The result obtained by Talagrand in 1994 seems to provide more ...
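For context, it may help to recall the classical distribution-free form of the VC generalization bound that these papers set out to refine; this is a textbook statement (Vapnik's bound), not a result quoted from any of the articles listed here. For a hypothesis class $\mathcal{H}$ of binary classifiers with VC dimension $d$, with probability at least $1-\delta$ over an i.i.d. sample of size $n$, every $h \in \mathcal{H}$ satisfies

$$ R(h) \;\le\; \hat{R}_n(h) + \sqrt{\frac{d\left(\ln\frac{2n}{d} + 1\right) + \ln\frac{4}{\delta}}{n}}, $$

where $R(h)$ is the true risk and $\hat{R}_n(h)$ the empirical (training) risk. Distribution-dependent bounds of the kind discussed above aim to tighten this worst-case rate for particular data distributions.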
Vapnik-Chervonenkis Dimension of Recurrent Neural Networks
Most of the work on the Vapnik-Chervonenkis dimension of neural networks has been focused on feedforward networks. However, recurrent networks are also widely used in learning applications, in particular when time is a relevant parameter. This paper provides lower and upper bounds for the VC dimension of such networks. Several types of activation functions are discussed, including threshold, po...
Vapnik-Chervonenkis Dimension
Valiant’s theorem from the previous lecture is meaningless for infinite hypothesis classes, or even classes with more than exponential size. In 1968, Vladimir Vapnik and Alexey Chervonenkis wrote a very original and influential paper (in Russian) [5, 6] which allows us to estimate the sample complexity for infinite hypothesis classes too. The idea is that the size of the hypothesis class is a p...
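For reference, the standard definition underlying these abstracts (a textbook definition, not taken from the lecture notes themselves): a set $\{x_1, \dots, x_n\}$ is shattered by $\mathcal{H}$ if $\mathcal{H}$ realizes all $2^n$ possible labelings of it, and the VC dimension is the size of the largest shattered set,

$$ \mathrm{VCdim}(\mathcal{H}) \;=\; \max\bigl\{\, n \;:\; \exists\, x_1, \dots, x_n \text{ with } \bigl|\{(h(x_1), \dots, h(x_n)) : h \in \mathcal{H}\}\bigr| = 2^n \,\bigr\}. $$

This quantity replaces the cardinality $|\mathcal{H}|$ in sample-complexity bounds, which is what makes the analysis applicable to infinite hypothesis classes.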
Journal
Journal title: Neural Computation
Year: 1996
ISSN: 0899-7667 (print), 1530-888X (electronic)
DOI: 10.1162/neco.1996.8.6.1277